# Distilled Models

## OPENCLIP SigLIP Tiny 14 Distill SigLIP 400m Cc9m

- License: MIT
- Description: A lightweight vision-language model based on the SigLIP architecture, distilled from the larger SigLIP-400m model and suited to zero-shot image classification.
- Tags: Image Classification
- Author: PumeTu · Downloads: 30 · Likes: 0
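
As a sketch of how an OpenCLIP-format SigLIP checkpoint like this is typically used for zero-shot classification; the hub repo ID below is hypothetical, inferred from the listing name:

```python
# Zero-shot image classification sketch with open_clip.
# The hf-hub repo ID is an assumption based on this listing's name.
import torch
import open_clip
from PIL import Image

repo = "hf-hub:PumeTu/OpenCLIP-SigLIP-Tiny-14-distill-SigLIP-400m-cc9m"  # hypothetical ID
model, preprocess = open_clip.create_model_from_pretrained(repo)
tokenizer = open_clip.get_tokenizer(repo)

image = preprocess(Image.open("photo.jpg")).unsqueeze(0)
labels = ["a photo of a cat", "a photo of a dog", "a photo of a car"]
text = tokenizer(labels)

with torch.no_grad():
    img_feat = model.encode_image(image)
    txt_feat = model.encode_text(text)
    img_feat /= img_feat.norm(dim=-1, keepdim=True)
    txt_feat /= txt_feat.norm(dim=-1, keepdim=True)
    # SigLIP trains with a sigmoid loss, but cosine similarity
    # still ranks candidate labels for zero-shot classification.
    scores = (img_feat @ txt_feat.T).squeeze(0)

print(labels[scores.argmax().item()])
```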
## BERTA

- License: MIT
- Description: BERTA is obtained by distilling the embeddings of the FRIDA model into LaBSE-ru-turbo; it computes sentence embeddings for Russian and English text and supports multiple task prefixes.
- Tags: Text Embedding, Transformers, Multilingual
- Author: sergeyzh · Downloads: 7,089 · Likes: 12
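
A minimal embedding sketch with sentence-transformers, assuming the repo ID sergeyzh/BERTA and FRIDA-style task prefixes (both inferred from the listing, not verified):

```python
# Sentence-embedding sketch; the repo ID and prefix convention are assumptions.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sergeyzh/BERTA")  # assumed repo ID

# FRIDA-style prefixes select the task, e.g. "search_query:" /
# "search_document:" for retrieval-style encoding.
sentences = [
    "search_query: Why is the sky blue?",
    "search_document: Sunlight scatters off air molecules; blue scatters most.",
]
embeddings = model.encode(sentences, normalize_embeddings=True)
print(embeddings.shape)  # (2, embedding_dim)
```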
## FLUX.1 Lite GGUF

- License: Other
- Description: Flux.1 Lite is an 8-billion-parameter transformer distilled from FLUX.1-dev and optimized for text-to-image generation; it reduces memory usage and improves speed while maintaining accuracy.
- Tags: Text-to-Image
- Author: gpustack · Downloads: 5,452 · Likes: 3
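
Recent diffusers releases (0.32+) can load GGUF-quantized transformer weights directly. A sketch under the assumption that this repo ships such a file; every repo and file name below is a guess:

```python
# GGUF loading sketch for a Flux-style pipeline; repo and file names
# are assumptions, not verified against the actual listing.
import torch
from diffusers import FluxPipeline, FluxTransformer2DModel, GGUFQuantizationConfig

gguf_url = (
    "https://huggingface.co/gpustack/FLUX.1-lite-GGUF"
    "/blob/main/flux.1-lite-Q4_K_M.gguf"  # hypothetical file name
)
transformer = FluxTransformer2DModel.from_single_file(
    gguf_url,
    quantization_config=GGUFQuantizationConfig(compute_dtype=torch.bfloat16),
    torch_dtype=torch.bfloat16,
)
pipe = FluxPipeline.from_pretrained(
    "Freepik/flux.1-lite-8B-alpha",  # assumed source for the non-transformer components
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()  # trades speed for lower VRAM use
image = pipe("a lighthouse at dusk", num_inference_steps=24).images[0]
image.save("lighthouse.png")
```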
## Kotoba Whisper V1.0

- License: Apache-2.0
- Description: Kotoba-Whisper is a collection of distilled Whisper models for Japanese automatic speech recognition, developed jointly by Asahi Ushio and Kotoba Technologies; it runs 6.3x faster than the original large-v3 while maintaining comparably low error rates.
- Tags: Speech Recognition, Transformers, Japanese
- Author: kotoba-tech · Downloads: 2,397 · Likes: 53
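
A minimal transcription sketch via the transformers pipeline, assuming the repo ID kotoba-tech/kotoba-whisper-v1.0:

```python
# Japanese ASR sketch; the repo ID is assumed from the listing name.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="kotoba-tech/kotoba-whisper-v1.0",  # assumed repo ID
)
result = asr(
    "sample_ja.wav",  # path to a local audio file
    generate_kwargs={"language": "ja", "task": "transcribe"},
)
print(result["text"])
```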
## LCM Dreamshaper V7

- License: MIT
- Description: A latent consistency model distilled from Dreamshaper v7, a fine-tuned version of Stable-Diffusion v1-5; it generates high-quality images in a fraction of the usual inference time.
- Tags: Image Generation, English
- Author: ckpt · Downloads: 190 · Likes: 3
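
A few-step sampling sketch with diffusers; the repo ID below is the widely used upstream checkpoint and is an assumption here (the listed copy may be hosted under another account):

```python
# Few-step LCM sampling sketch; the repo ID is an assumption.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "SimianLuo/LCM_Dreamshaper_v7", torch_dtype=torch.float16
).to("cuda")

image = pipe(
    "a watercolor painting of a mountain village",
    num_inference_steps=4,  # LCM distillation makes 2-8 steps sufficient
    guidance_scale=8.0,
).images[0]
image.save("village.png")
```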
## Indictrans2 En Indic Dist 200M

- License: MIT
- Description: IndicTrans2 is a high-quality machine translation model supporting translation between English and 22 Indic languages; this version is a distilled 200M-parameter variant.
- Tags: Machine Translation, Transformers, Multilingual
- Author: ai4bharat · Downloads: 4,461 · Likes: 12
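
A rough English-to-Hindi sketch, assuming the repo ID ai4bharat/indictrans2-en-indic-dist-200M; the official workflow preprocesses input with an IndicProcessor helper, approximated here by prepending the language tags by hand:

```python
# IndicTrans2 sketch; the repo ID and hand-written language tags are
# assumptions (the official toolkit's IndicProcessor normally adds them).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "ai4bharat/indictrans2-en-indic-dist-200M"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id, trust_remote_code=True)

# Source and target language tags precede the sentence.
text = "eng_Latn hin_Deva The weather is lovely today."
inputs = tokenizer(text, return_tensors="pt")
output = model.generate(**inputs, max_length=128, num_beams=5)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```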
## Nllb 200 Distilled 600M Dz To En

- Description: A Dzongkha-to-English translation model fine-tuned from the distilled 600M-parameter version of NLLB-200.
- Tags: Machine Translation, Transformers
- Author: KarmaCST · Downloads: 17 · Likes: 0
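
A translation sketch using the standard NLLB API, assuming the repo ID KarmaCST/nllb-200-distilled-600M-dz-to-en and NLLB's language codes dzo_Tibt and eng_Latn:

```python
# NLLB fine-tune sketch; the repo ID is assumed from the listing name.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "KarmaCST/nllb-200-distilled-600M-dz-to-en"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id, src_lang="dzo_Tibt")
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("<Dzongkha sentence here>", return_tensors="pt")
output = model.generate(
    **inputs,
    # NLLB decodes into the language named by the forced BOS token.
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("eng_Latn"),
    max_length=128,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```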
## Small100

- License: MIT
- Description: SMaLL-100 is a compact, fast, massively multilingual machine translation model covering more than 10,000 language pairs, with performance comparable to M2M-100 at a much smaller size and higher speed.
- Tags: Machine Translation, Transformers, Multilingual
- Author: alirezamsh · Downloads: 5,374 · Likes: 81
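
SMaLL-100 reuses the M2M-100 architecture but conditions only on the target language. Its official card ships a custom tokenizer file that must be fetched separately, so a usage sketch looks roughly like this (the tokenizer file name is taken on trust):

```python
# SMaLL-100 sketch; tokenization_small100.py must be downloaded from the
# model repo first - the import below assumes that file is present locally.
from transformers import M2M100ForConditionalGeneration
from tokenization_small100 import SMALL100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("alirezamsh/small100")
tokenizer = SMALL100Tokenizer.from_pretrained("alirezamsh/small100")

tokenizer.tgt_lang = "fr"  # only the target language is specified
inputs = tokenizer("Life is like a box of chocolates.", return_tensors="pt")
output = model.generate(**inputs)
print(tokenizer.batch_decode(output, skip_special_tokens=True))
```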
## Distilbert Base Pl Cased

- License: Apache-2.0
- Description: A customized compact version of distilbert-base-multilingual-cased, specifically optimized for Polish while retaining the original model's accuracy.
- Tags: Large Language Model, Transformers, Other
- Author: Geotrend · Downloads: 92 · Likes: 1
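
These Geotrend variants keep the interface of distilbert-base-multilingual-cased, so a masked-token sketch like the following also fits the two subsets listed next (the model ID is assumed from the listing name):

```python
# Masked-language-model sketch for the Geotrend DistilBERT variants;
# the model ID is assumed; swap it for the other language subsets.
from transformers import pipeline

fill = pipeline("fill-mask", model="Geotrend/distilbert-base-pl-cased")
# "Warszawa to stolica [MASK]." = "Warsaw is the capital of [MASK]."
for pred in fill("Warszawa to stolica [MASK]."):
    print(pred["token_str"], round(pred["score"], 3))
```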
## Distilbert Base En It Cased

- License: Apache-2.0
- Description: A lightweight version of distilbert-base-multilingual-cased, specifically optimized for English and Italian while retaining the original model's accuracy.
- Tags: Large Language Model, Transformers, Other
- Author: Geotrend · Downloads: 20 · Likes: 0
## Distilbert Base En Fr Es Pt It Cased

- License: Apache-2.0
- Description: A lightweight version of distilbert-base-multilingual-cased covering English, French, Spanish, Portuguese, and Italian.
- Tags: Large Language Model, Transformers, Multilingual
- Author: Geotrend · Downloads: 24 · Likes: 0